Fast Iterated Bootstrap Mean Bias Correction
Abstract
The article proposes a computationally efficient procedure for bias adjustment in the iterated bootstrap. The new technique removes the need for successive levels of bootstrap resampling by approximating the double bootstrap “calibrating coefficient” with only one draw from the second-level probability distribution. Extensive Monte Carlo evidence suggests that the proposed approximation performs better than the ordinary bootstrap bias correction. The article evaluates the usefulness of the bootstrap and the fast bootstrap in reducing the bias of generalized method of moments estimators under weak instruments. In identified models, the fast bootstrap bias correction leads to estimators with lower variance than those based on the double bootstrap. The proposed fast iterated bootstrap performs better than the double bootstrap in all scenarios, especially when the model has the weakest instrument relevance and the highest degree of endogeneity. However, when the estimators have no finite moments and the instruments are weak, the bootstrap does not work well and iterating it makes things worse.
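To fix ideas, here is a minimal NumPy sketch, under illustrative assumptions, of the kind of procedure described: each first-level bootstrap resample is paired with a single second-level draw, and those draws stand in for the inner loop that a full double bootstrap would use to calibrate the bias correction. The statistic (np.var), the resample count B, and the exact correction formula are assumptions for illustration, not the article's specification.

```python
import numpy as np

rng = np.random.default_rng(0)

def estimator(x):
    # Example statistic whose bias we want to correct: the sample variance
    # with divisor n, which is biased downward for the population variance.
    return np.var(x)

def fast_double_bootstrap_bias_correction(x, B=999):
    """Sketch of a fast iterated bootstrap bias correction.

    For each first-level resample, a single second-level resample is drawn
    instead of a full inner bootstrap loop; the single draws approximate the
    double-bootstrap calibrating term.
    """
    n = len(x)
    theta_hat = estimator(x)

    theta_star = np.empty(B)        # first-level bootstrap estimates
    theta_star_star = np.empty(B)   # one second-level estimate per resample
    for b in range(B):
        xb = rng.choice(x, size=n, replace=True)        # first-level resample
        theta_star[b] = estimator(xb)
        xbb = rng.choice(xb, size=n, replace=True)      # single second-level draw
        theta_star_star[b] = estimator(xbb)

    bias1 = theta_star.mean() - theta_hat                # ordinary bootstrap bias estimate
    bias2 = theta_star_star.mean() - theta_star.mean()   # approximate second-level bias
    single = theta_hat - bias1                           # one-step bootstrap correction
    fast_iterated = theta_hat - (2.0 * bias1 - bias2)    # calibrated (iterated) correction
    return single, fast_iterated

x = rng.normal(size=30)
print(fast_double_bootstrap_bias_correction(x))
```

The cost is one extra resample per first-level replication, so the whole procedure scales like a single bootstrap rather than the B² resamples of a full double bootstrap.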
Similar resources
Computationally efficient approximation for the double bootstrap mean bias correction
We propose a computationally efficient approximation for the double bootstrap bias adjustment factor without using the inner bootstrap loop. The approximation converges in probability to the population bias correction factor. We study the finite sample properties of the approximation in the context of a linear instrumental variable model. In identified versions of the model considered in our Mo...
An Asymptotic Analysis of the Bootstrap Bias Correction for the Empirical CTE
The α-level Conditional Tail Expectation (CTE) of a continuous random variable X is defined as its conditional expectation given the event {X ≥ q_α}, where q_α represents its α-level quantile. It is well known that the empirical CTE (the average of the n(1 − α) largest order statistics in a sample of size n) is a negatively biased estimator of the CTE. This bias vanishes as the sample size increases but ...
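As a concrete illustration, the sketch below computes the empirical CTE and a generic one-step bootstrap bias correction of it; the confidence level alpha, the resample count B, and the exponential test data are assumptions for illustration, and this is not the asymptotic analysis of the cited article.

```python
import numpy as np

rng = np.random.default_rng(1)

def empirical_cte(x, alpha=0.95):
    """Empirical CTE: the average of the largest ceil(n * (1 - alpha)) order statistics."""
    x = np.sort(x)
    k = int(np.ceil(len(x) * (1.0 - alpha)))
    return x[-k:].mean()

def bootstrap_bias_corrected_cte(x, alpha=0.95, B=999):
    """One-step bootstrap bias correction of the empirical CTE (generic sketch)."""
    cte_hat = empirical_cte(x, alpha)
    boot = np.array([empirical_cte(rng.choice(x, size=len(x), replace=True), alpha)
                     for _ in range(B)])
    bias_hat = boot.mean() - cte_hat          # bootstrap estimate of the (negative) bias
    return cte_hat - bias_hat                 # bias-corrected empirical CTE

x = rng.exponential(size=200)
print(empirical_cte(x), bootstrap_bias_corrected_cte(x))
```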
Bias-Corrected Maximum Likelihood Estimation of the Parameters of the Generalized Pareto Distribution
We derive analytic expressions for the biases, to O(n⁻¹), of the maximum likelihood estimators of the parameters of the generalized Pareto distribution. Using these expressions to bias-correct the estimators in a selective manner is found to be extremely effective in terms of bias reduction, and can also result in a small reduction in relative mean squared error. In terms of remaining relative bi...
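The general recipe behind such analytic corrections is to estimate the leading O(n⁻¹) bias term at the MLE and subtract it. The sketch below illustrates this on a deliberately simple case, the exponential-rate MLE, whose first-order bias is known in closed form; it is an assumed toy example and does not reproduce the GPD bias expressions of the cited article.

```python
import numpy as np

rng = np.random.default_rng(4)

# Analytic first-order bias correction, toy example: for the exponential-rate MLE
# lambda_hat = 1 / mean(x), E[lambda_hat] = n * lambda / (n - 1), so the leading bias
# is approximately lambda / n. Subtracting an estimate of it at the MLE gives
# lambda_hat * (1 - 1/n), which here is exactly unbiased.
n = 20
x = rng.exponential(scale=1.0 / 2.0, size=n)   # sample with true rate lambda = 2
lam_hat = 1.0 / x.mean()                        # MLE, biased upward
lam_corrected = lam_hat - lam_hat / n           # analytic first-order bias correction
print(lam_hat, lam_corrected)
```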
Bias Correction with Jackknife, Bootstrap, and Taylor Series
We analyze bias correction methods using the jackknife, bootstrap, and Taylor series. We focus on the binomial model, and consider the problem of bias correction for estimating f(p), where f ∈ C[0, 1] is arbitrary. We characterize the supremum norm of the bias of general jackknife and bootstrap estimators for any continuous function, and demonstrate that in the delete-d jackknife, different values o...
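For reference, the classical delete-one jackknife bias correction for a plug-in estimate f(p̂) in the Bernoulli/binomial setting looks like the sketch below; the choice f(p) = p(1 − p) and the sample size are assumptions for illustration, and this is the textbook construction rather than the supremum-norm analysis of the cited article.

```python
import numpy as np

def jackknife_bias_corrected(x, f):
    """Delete-one jackknife bias correction for the plug-in estimate f(mean(x))."""
    n = len(x)
    theta_hat = f(x.mean())
    # Leave-one-out plug-in estimates.
    loo = np.array([f(np.delete(x, i).mean()) for i in range(n)])
    bias_hat = (n - 1) * (loo.mean() - theta_hat)   # jackknife bias estimate
    return theta_hat - bias_hat                     # bias-corrected estimate

rng = np.random.default_rng(2)
x = rng.binomial(1, 0.3, size=50)                   # Bernoulli(p) observations
print(jackknife_bias_corrected(x, lambda p: p * (1 - p)))  # estimating f(p) = p(1 - p)
```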
Parametric bootstrap methods for bias correction in linear mixed models
The empirical best linear unbiased predictor (EBLUP) in the linear mixed model (LMM) is useful for small area estimation, and estimation of the mean squared error (MSE) of the EBLUP is important as a measure of its uncertainty. To obtain a second-order unbiased estimator of the MSE, the second-order bias correction has been derived mainly from Taylor series expansions. However, thi...
Publication date: 2009